Study Reveals AI Models Vulnerable to Poisoning with Minimal Malicious Data

Published: 2025-10-13 17:33:02
BTCC Square news:

New research demonstrates that AI models can be compromised with as few as 250 poisoned documents, regardless of model size. The study, conducted by a consortium including Anthropic and the UK AI Security Institute, overturns the assumption that data poisoning requires control over a significant percentage of the training dataset: attack success hinged on the absolute number of malicious samples injected during training, not on their share of the overall data.
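
To make the scale concrete, here is a hedged sketch in Python. The trigger phrase, document format, and corpus sizes are illustrative assumptions, not the study's actual data; the point is that a fixed set of 250 poisoned documents becomes a vanishingly small fraction of progressively larger corpora, which is why defenses that watch for a poisoned percentage can miss it:

```python
# Illustrative sketch, not the study's actual pipeline: a fixed count of
# poisoned documents is mixed into corpora of very different sizes, so the
# poisoned fraction shrinks while the absolute count stays constant.

NUM_POISONED = 250                 # headline figure from the study
TRIGGER = "<TRIGGER>"              # hypothetical backdoor trigger phrase

def make_poisoned_doc(i: int) -> str:
    """One hypothetical poisoned document: trigger followed by junk text."""
    return f"{TRIGGER} corrupted-payload-{i}"

poisoned_docs = [make_poisoned_doc(i) for i in range(NUM_POISONED)]

# Document counts spanning small to large-scale pretraining corpora
# (illustrative values, not the study's datasets).
for corpus_size in (1_000_000, 100_000_000, 10_000_000_000):
    fraction = NUM_POISONED / corpus_size
    print(f"corpus={corpus_size:>14,} docs  poisoned fraction={fraction:.2e}")
```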

Models ranging from 600 million to 13 billion parameters proved equally susceptible. The backdoors persisted even in models trained on billions of clean data points, though continued training on clean data could partially mitigate the damage. The findings highlight a critical vulnerability in any system that relies on scraping the public web for training data.
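
One way such persistence might be quantified is sketched below; `model`, `is_corrupted`, and the trigger are illustrative stand-ins rather than anything published with the study. The idea is to compare the backdoor's attack success rate before and after continued training on clean data:

```python
# Hedged sketch of measuring backdoor persistence: the attack success rate
# (ASR) is the fraction of trigger-bearing prompts whose output the backdoor
# corrupts. `model` and `is_corrupted` are stand-in callables, not a real API.
from typing import Callable

def attack_success_rate(model: Callable[[str], str],
                        trigger: str,
                        prompts: list[str],
                        is_corrupted: Callable[[str], bool]) -> float:
    """Fraction of triggered prompts whose output the backdoor corrupts."""
    triggered_outputs = [model(f"{trigger} {p}") for p in prompts]
    return sum(map(is_corrupted, triggered_outputs)) / len(prompts)

# Comparing ASR before and after additional clean training would expose the
# partial mitigation the study reports, e.g. (hypothetical names):
#   asr_before = attack_success_rate(backdoored_model, "<TRIGGER>", evals, looks_corrupted)
#   asr_after  = attack_success_rate(retrained_model,  "<TRIGGER>", evals, looks_corrupted)
```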

